# Instruction-following pretraining

Ahma 7B
License: Apache-2.0
Ahma-7B is a 7-billion-parameter decoder-only Transformer based on the Meta Llama (v1) architecture, pretrained entirely from scratch on Finnish.
Tags: Large Language Model, Transformers, Other
Organization: Finnish-NLP
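
As a quick orientation, here is a minimal usage sketch. It assumes the checkpoint is published on the Hugging Face Hub under the repo id `Finnish-NLP/Ahma-7B` (inferred from the organization and model names in this listing, not confirmed by it) and that it loads through the standard `transformers` causal-LM classes.

```python
# Minimal sketch: load Ahma-7B and generate a short Finnish continuation.
# The repo id below is an assumption inferred from this listing.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Finnish-NLP/Ahma-7B"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Encode a Finnish prompt and sample a short continuation.
inputs = tokenizer("Suomi on", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```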